Switching circuit theory is the mathematical study of the properties of networks of idealized switches. Such networks may be strictly combinational logic, in which the output state is a function only of the present state of the inputs, or they may also contain sequential elements, in which the output depends on both the present inputs and past input states; in that sense, sequential circuits are said to include "memory" of past states. State machines are an important class of sequential circuits. Switching circuit theory is applicable to the design of telephone systems, computers, and similar systems.

In his 1938 paper ''A Symbolic Analysis of Relay and Switching Circuits'', Claude Shannon showed that two-valued Boolean algebra can describe the operation of switching circuits. The principles of Boolean algebra are applied to switches, providing mathematical tools for the analysis and synthesis of any switching system.

Ideal switches are considered as having only two exclusive states, for example, open or closed. In some analyses, the state of a switch can be considered to have no influence on the output of the system and is designated as a "don't care" state.

In complex networks it is also necessary to account for the finite switching time of physical switches; where two or more different paths in a network may affect the output, these delays may result in a "logic hazard" or "race condition", in which the output state changes due to the differing propagation times through the network.
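The correspondence between switch networks and Boolean algebra, and the distinction between combinational and sequential behaviour, can be illustrated with a minimal sketch (not part of the original article): a closed switch is modelled as True and an open switch as False, so a series connection behaves as logical AND and a parallel connection as logical OR, while a latch shows how a sequential element retains "memory" of past inputs. The names below (series, parallel, SRLatch) are illustrative choices for this sketch, not established terminology.

<syntaxhighlight lang="python">
def series(*switches: bool) -> bool:
    # A series path conducts only if every switch is closed (logical AND).
    return all(switches)

def parallel(*switches: bool) -> bool:
    # A parallel group conducts if any switch is closed (logical OR).
    return any(switches)

def combinational_example(a: bool, b: bool, c: bool) -> bool:
    # Combinational logic: the output is a function of the present inputs only,
    # here (a AND b) OR (NOT c).
    return parallel(series(a, b), not c)

class SRLatch:
    """Minimal sequential element: the output also depends on past inputs."""
    def __init__(self) -> None:
        self.state = False                  # remembered ("memory") state

    def step(self, set_: bool, reset: bool) -> bool:
        if set_ and not reset:
            self.state = True
        elif reset and not set_:
            self.state = False
        # Otherwise (neither asserted, or both in this simplified sketch)
        # the latch holds its previous state.
        return self.state

if __name__ == "__main__":
    print(combinational_example(True, True, True))   # True: a AND b closes the path
    latch = SRLatch()
    print(latch.step(True, False))    # True  (set)
    print(latch.step(False, False))   # True  (held: output reflects a past input)
</syntaxhighlight>

The combinational function returns a value determined entirely by its arguments, whereas the latch's output also depends on inputs applied earlier, mirroring the distinction drawn above.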
==See also==
*Karnaugh map
*Boolean circuit
*C-element
*Circuit minimization
*Circuit complexity
*Circuit switching
*Logic design
*Logic in computer science
*Logic gate
*Nonblocking minimal spanning switch
*Quine–McCluskey algorithm
*Relay - the kind of logic device Shannon was concerned with in 1938
*Programmable logic controller - computer software mimics relay circuits for industrial applications
*Switching lemma
*Unate function